# 24B parameter scale
## Mistral Small 3.2 24B Instruct 2506 Bf16

- License: Apache-2.0
- An MLX-format model converted from Mistral-Small-3.2-24B-Instruct-2506, suited to instruction-following tasks (see the loading sketch below).
- Tags: Large Language Model, Supports Multiple Languages
- Publisher: mlx-community
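
For reference, here is a minimal sketch of running the MLX conversion locally with the `mlx-lm` Python package. The repository id below is an assumption inferred from the model name; adjust it to the actual mlx-community repo path if it differs.

```python
# Minimal sketch: load and query the MLX conversion with mlx-lm.
# The repo id is assumed from the model name and may need adjusting.
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Mistral-Small-3.2-24B-Instruct-2506-bf16")

# Format a single user turn with the model's chat template.
messages = [{"role": "user", "content": "Explain what bf16 weights are in one sentence."}]
prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True)

response = generate(model, tokenizer, prompt=prompt, max_tokens=200, verbose=True)
print(response)
```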
## Buddyglassuncensored2025.4

- A merged model based on Mistral-Small-24B-Instruct-2501, built with the DARE TIES merge method to combine multiple models at the 24B parameter scale (a toy sketch of the method follows below).
- Tags: Large Language Model, Transformers
- Publisher: darkc0de
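
To make the merge method concrete, below is a toy NumPy sketch of the DARE + TIES idea: DARE randomly drops a fraction of each fine-tune's parameter deltas and rescales the survivors, then TIES elects a per-parameter sign and averages only the deltas that agree with it. This illustrates the general technique only; it is not the actual recipe or set of constituent models behind this release.

```python
# Toy illustration of the DARE + TIES merge idea on flat NumPy vectors.
# Real merges operate on full checkpoints (e.g. via mergekit) with
# per-model density/weight settings; this is a conceptual sketch only.
import numpy as np

rng = np.random.default_rng(0)

def dare(delta, density, rng):
    """DARE: randomly drop (1 - density) of the delta entries, rescale the rest."""
    mask = rng.random(delta.shape) < density
    return np.where(mask, delta / density, 0.0)

def dare_ties_merge(base, finetuned_models, density=0.5, rng=rng):
    """Merge task vectors with TIES sign election, keeping sign-consistent entries."""
    # Task vectors: how each fine-tune differs from the shared base model.
    deltas = [dare(m - base, density, rng) for m in finetuned_models]
    stacked = np.stack(deltas)
    # Elect a per-parameter sign from the summed deltas (TIES sign consensus).
    elected = np.sign(stacked.sum(axis=0))
    # Keep only entries whose sign agrees with the elected sign, then average them.
    agree = np.sign(stacked) == elected
    kept = np.where(agree, stacked, 0.0)
    counts = np.maximum(agree.sum(axis=0), 1)  # avoid division by zero
    merged_delta = kept.sum(axis=0) / counts
    return base + merged_delta

# Tiny demo with random "weights" standing in for real 24B-parameter checkpoints.
base = rng.normal(size=8)
fine_tunes = [base + rng.normal(scale=0.1, size=8) for _ in range(3)]
print(dare_ties_merge(base, fine_tunes, density=0.5))
```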